Learning to Match Images in Large-Scale Collections
Authors
Abstract
Many computer vision applications require computing structure and feature correspondence across a large, unorganized image collection. This is a computationally expensive process, because the graph of matching image pairs is unknown in advance, and so methods for quickly and accurately predicting which of the O(n²) pairs of images match are critical. Image comparison methods such as bag-of-words models or global features are often used to predict similar pairs, but can be very noisy. In this paper, we propose a new image matching method that uses discriminative learning techniques—applied to training data gathered automatically during the image matching process—to gradually compute a better similarity measure for predicting whether two images in a given collection overlap. By using such a learned similarity measure, our algorithm can select image pairs that are more likely to match for performing further feature matching and geometric verification, improving the overall efficiency of the matching process. Our approach processes a set of images in an iterative manner, alternately performing pairwise feature matching and learning an improved similarity measure. Our experiments show that our learned measures can significantly improve match prediction over the standard tf-idf-weighted similarity and more recent unsupervised techniques even with small amounts of training data, and can improve the overall speed of the image matching process by more than a factor of two.
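The iterative scheme described in the abstract—rank pairs by a tf-idf baseline, verify the most promising candidates with expensive feature matching, then train a discriminative model on the verification outcomes to re-score the remaining pairs—can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the pair feature (element-wise absolute difference of tf-idf vectors), the logistic-regression classifier, and the per-round batch size are all illustrative choices, and `verify_pair` stands in for the paper's feature matching plus geometric verification step.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def iterative_matching(bow, verify_pair, rounds=3, per_round=50):
    """Alternate between verifying the most promising image pairs and
    learning a better pair-similarity measure from the outcomes.

    bow:         (n_images, vocab_size) tf-idf-weighted bag-of-words matrix
    verify_pair: expensive oracle, verify_pair(i, j) -> bool
                 (stands in for feature matching + geometric verification)
    """
    n = bow.shape[0]
    normed = bow / (np.linalg.norm(bow, axis=1, keepdims=True) + 1e-12)
    score = normed @ normed.T          # baseline: tf-idf cosine similarity
    tried, X, y, matches, clf = set(), [], [], [], None
    for _ in range(rounds):
        # Rank all untried pairs by the current similarity measure.
        pairs = [(i, j) for i in range(n) for j in range(i + 1, n)
                 if (i, j) not in tried]
        pairs.sort(key=lambda p: score[p], reverse=True)
        for i, j in pairs[:per_round]:
            tried.add((i, j))
            label = verify_pair(i, j)              # the expensive step
            if label:
                matches.append((i, j))
            # Pair feature: element-wise difference of the two vectors.
            X.append(np.abs(normed[i] - normed[j]))
            y.append(int(label))
        # Learn an improved similarity once both classes have been seen,
        # then re-score every candidate pair with the learned model.
        if len(set(y)) > 1:
            clf = LogisticRegression(max_iter=1000).fit(np.array(X), y)
            for i in range(n):
                feats = np.abs(normed[i] - normed)  # features vs. all images
                score[i] = clf.predict_proba(feats)[:, 1]
    return matches, clf
```

The key trade-off the sketch illustrates: each round spends its verification budget only on the pairs the current measure ranks highest, so as the learned measure improves, fewer verifications are wasted on non-matching pairs.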
Similar references
Contributions to large-scale learning for image classification
Building algorithms that classify images on a large scale is an essential task due to the difficulty of searching the massive amount of unlabeled visual data available on the Internet. We aim to classify images based on their content to simplify the manageability of such large-scale collections. Large-scale image classification is a difficult problem, as datasets are large with respect to both th...
Effective Browsing of Image Search Results with Visual and Diverse Summarization via Clustering
With the unprecedented growth in the production of digital images and the use of multimedia references, the need for image and subject search has increased. Systematic processing of this information is a basic prerequisite for its effective analysis, organization, and management. Likewise, large collections of images have been made available on the Web, and many search engines have provided the poss...
JustClick: Personalized Image Recommendation via Exploratory Search from Large-Scale Flickr Image Collections
In this paper, we have developed a novel framework called JustClick to enable personalized image recommendation via exploratory search from large-scale collections of manually-annotated Flickr images. First, a topic network is automatically generated to summarize large-scale collections of manually-annotated Flickr images at a semantic level. Hyperbolic visualization is further used to enable int...
Match Graph Construction for Large Image Databases
How best to efficiently establish correspondence among a large set of images or video frames is an interesting unanswered question. For large databases, the high computational cost of performing pair-wise image matching is a major problem. However, for many applications, images are inherently sparsely connected, and so current techniques try to correctly estimate small potentially matching subs...
Machine Learning for Visual Concept Recognition and Ranking for Images
Recognition of a large set of generic visual concepts in images, and ranking of images based on visual semantics, are among the unsolved tasks for future multimedia and scientific applications based on image collections. From that perspective, improvements in the quality of semantic annotations for image data are well matched to the goals of the THESEUS project with respect to multimedia and scien...
Publication date: 2012